A wide range of Azure data can be ingested into Splunk, including usage information and resource logs (HTTP, Console, Platform, etc.). By ingesting these logs, resource data can be explored alongside usage data to identify the reasons behind unexpected costs. Additionally, in an organisational environment, viewing this data in Splunk means dashboards can be created and modified for specific departments or people, ensuring they only have access to data relevant to their role.

In these articles, we will use the Splunk Add-on for Microsoft Cloud Services, which provides all the features necessary to ingest and search Azure data.

Add-on Installation

Installation of the Splunk Add-on for Microsoft Cloud Services differs depending on your Splunk environment. Search heads require the add-on, as it provides Microsoft Cloud Services knowledge management, while universal forwarders are the only instance type the add-on does not support, as it requires Python and the Splunk REST handler.

Single-Instanced Splunk Enterprise

See installation article.

Distributed Splunk Enterprise

See installation article; remember, this add-on is not compatible with universal forwarders.

Add-on Setup

The APIs this add-on relies on require some prerequisite setup, depending on the data we want to ingest.

Data to Ingest | Required Component
Azure Resource | Microsoft Entra Application
Azure Audit | Microsoft Entra Application
Azure Event Hub | Microsoft Entra Application
Azure Metrics | Microsoft Entra Application
Azure KQL Log Analytics | Microsoft Entra Application
Azure Consumption (Billing) | Microsoft Entra Application

Microsoft Entra Application

Required for the following resources:

  • Azure Resource
  • Azure Audit
  • Azure Event Hub
  • Azure Metrics
  • Azure KQL Log Analytics
  • Azure Consumption (Billing)

The add-on obtains data from the above resources using the Windows Azure Service Management APIs. To access these APIs, we must create a Microsoft Entra Application (previously called an Azure Active Directory Application) and then assign it a role so it has the correct permissions to access our resources.
Note – The permission Application.ReadWrite.All is required by the user registering the application and assigning it a role.

Creating the Application

  1. Sign in to the Microsoft Entra Admin Centre.
  2. Using the left navigation bar, head to Identity > Applications > App registrations then click New registration.
  3. Give the application a name; in this example, we used splunk-demo.
A screenshot of the Azure App Registrations window

We now have an Entra application; before we can use it, we must assign it a role.

Assigning the Application’s Role

  1. Go to the Azure Portal. We should still be logged in, but if not, log in.
  2. Head to Home > Subscriptions.
  3. Select the subscription we want to use from the Subscription Name column.
  4. Select Access control (IAM).
  5. Select Add, then Add role assignment.
  6. Within the Role tab, select the Reader role.
    a. The Reader role will cover most resources, but to read event hubs, we must also assign the Azure Event Hubs Data Receiver role.
  7. Select Next.
  8. Within the Members tab, select Assign access to, then select User, group, or service principal.
  9. Select Select members. By default, Microsoft Entra applications aren’t displayed in the available options. To find our application, search for it by name; in our case, splunk-demo.
  10. Select the Select button, then select Review + assign.
Now that our application has the correct role, we must obtain the application credentials.

Obtaining Application Credentials

  1. Return to the Microsoft Entra Admin Centre. We should still be logged in, but if not, log in.
  2. Using the left navigation bar, head to Identity > Applications > App registrations, then click on your application from the Display name column.
  3. Within the Essentials dropdown, take note of:
    a. Application (client) ID – known as our Client ID.
    b. Directory (tenant) ID – known as our Tenant ID.
  4. Select Certificates & secrets.
  5. Select Client secrets, then + New client secret.
  6. Provide a description and a duration.
  7. Select Add.
  8. Note down the Value – known as our Key or Client Secret.
    a. Important – Note this down before leaving the page. This value is only displayed once.
    b. We can safely ignore Secret ID.

Our application credentials (known in the add-on as our account attributes) should be as follows:

Account Attributes

Attribute | Name in Splunk Web | Description
account_stanza_name | Name | Enter a friendly name for your Azure app account. The account name cannot contain any whitespace.
client_id | Client ID | Listed as Application (client) ID on Azure
client_secret | Key (Client Secret) | Listed as Value on Azure
tenant_id | Tenant ID | Listed as Directory (tenant) ID on Azure

Linking our Application to Splunk

We can now connect our application to Splunk using the credentials obtained above. This can be done through Splunk Web or Splunk’s configuration files.

Splunk Web
  1. Launch the add-on, then select Configuration.
  2. Select Azure App Account > Add Azure App Account.
  3. Enter a friendly Name for the account.
  4. Enter the Client ID, Key (Client Secret), and Tenant ID using the Account Attributes table above.
  5. Select Add.
Configuration Files
  1. Create or open $SPLUNK_HOME/etc/apps/Splunk_TA_microsoft-cloudservices/local/mscs_azure_accounts.conf.
  2. Add the following stanza using the Account Attributes table above:
    [<account_stanza_name>]
    client_id = <value>
    client_secret = <value>
    tenant_id = <value>
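For reference, a filled-in version of this file might look something like the sketch below; the stanza name and the GUID-style values are hypothetical placeholders, not real credentials.

# $SPLUNK_HOME/etc/apps/Splunk_TA_microsoft-cloudservices/local/mscs_azure_accounts.conf
# Hypothetical example values: substitute the credentials noted earlier
[splunk_demo_account]
client_id = 11111111-2222-3333-4444-555555555555
client_secret = <the secret Value from Certificates & secrets>
tenant_id = aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee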

Ingesting Data

Now that the add-on is installed and configured, we can begin ingesting our data into Splunk. For the purposes of this article, the add-on has been installed on a single Splunk instance. We will cover Azure Consumption (Billing) and Event Hubs.

Azure Consumption (Billing)

Here are the fields available for Azure Consumption and what they mean:

  • Name – A unique name for our input
  • Azure App Account – Select the Azure Application we made earlier
  • Subscription ID – The subscription we want to monitor. We can get this ID from:
    • Azure Portal
    • Home > Subscriptions > Select our Subscription from Subscription name column
    • Subscription ID
  • Data Type – Usage Details to collect usage details data, or Reservation Recommendation to collect reservation recommendation data
  • Interval – How often, in seconds, to read usage data; the default is 86400
  • Index – The index our data will be stored in
  • Sourcetype – The sourcetype the ingested data will use, default mscs:consumption:billing for Usage Details and mscs:consumption:reservation:recommendation for Reservation Recommendation
  • Query Days – Specify the maximum number of days to query; the default is 10.
  • Start Date – Select a Start Date to specify how far back to go when initially collecting data; the default is 90 days. When the Usage Details data type is selected, the start date is used to calculate the Usage Details API query date range. The end date is the start date plus the number of days specified by Query Days. For example, if the start date is 2022-01-01T00:00:00 and Query Days is 10, the end date is 2022-01-11T00:00:00.

Splunk Web

  1. Launch the add-on, then select Inputs.
  2. Select Create New Input > Azure Consumption (Billing).
  3. Fill in the fields with the above data. See the image below as an example.
    The Azure Consumption Billing web configuration within Splunk
  4. Select Add.

Configuration Files

  1. In your Splunk platform deployment, navigate to:
    • $SPLUNK_HOME/etc/apps/Splunk_TA_microsoft-cloudservices/local
  2. Create a file named inputs.conf, if it does not already exist.
  3. Add the following stanza for the consumption input (a filled-in example sketch follows these steps):
    • Input configuration for the Usage Details data type
[mscs_azure_consumption://<input_stanza_name>]
account = <value>
data_type = Usage Details
index = <value>
interval = 86400
query_days = <value>
sourcetype = mscs:consumption:billing
start_date = <value>
subscription_id = <value>
  • Input configuration for Reservation Recommendation data type
[mscs_azure_consumption://<input_stanza_name>]
account = <value>
data_type = Reservation Recommendation
index = <value>
interval = 86400
sourcetype = mscs:consumption:reservation:recommendation
subscription_id = <value>

4. Save and restart the Splunk platform.
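For reference, a filled-in Usage Details stanza might look something like the sketch below; the input name, account name, index, subscription ID, and start date are hypothetical placeholders chosen for this example.

# Hypothetical example: adjust the account, index, dates, and subscription to your environment
[mscs_azure_consumption://billing_usage_details]
account = splunk_demo_account
data_type = Usage Details
index = azure_billing
interval = 86400
query_days = 10
sourcetype = mscs:consumption:billing
start_date = 2022-01-01T00:00:00
subscription_id = 00000000-0000-0000-0000-000000000000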

Azure Event Hubs

Configure your inputs on the Splunk platform instance responsible for collecting data for this add-on, usually a heavy forwarder; in our case, we’re using a single Splunk instance.
Here are the fields available for Azure Event Hubs and what they mean:

  • Name – A unique name for our input.
  • Azure App Account – Select the Azure Application we made earlier.
  • Event Hub Namespace (FQDN) – Known as the Event Hub Namespace’s “Host name”, for example, azure-to-splunk-demo.servicebus.windows.net. We can get this from:
    • Azure Portal
    • Home > Event Hubs > Select our Event Hub Namespace from the Name column
    • Host name
  • Event Hub Name – The name of the Event Hub within the Event Hub Namespace
  • Consumer Group – The Azure Event Hub Consumer Group.
  • Max Wait Time – The maximum interval in seconds the event processor will wait before processing. The default is 300 seconds.
  • Max Batch Size – The maximum number of events to retrieve in one batch. The default is 300.
  • Transport Type – AMQP over Websocket or AMQP.
  • Index – The index the data will be stored in.
  • Sourcetype – The sourcetype the ingested data will use, default mscs:azure:eventhub.
  • Interval – The number of seconds to wait before the Splunk platform reruns the command. The default is 3600 seconds.
  • Enable Blob Checkpoint Store – Enable a storage blob as a checkpoint for the event hub input. Enabling this requires two additional fields:
    • Azure Storage Account – The Azure Storage account in which the container is created to store event hub checkpoints.
    • Container Name – Enter the container name under the storage account. You can only add one container name for each input.

Splunk Web

  1. Launch the add-on, then select Inputs.
  2. Select Create New Input > Azure Event Hub.
  3. Fill in the fields with the above data. See the image below as an example.
    The first screenshot of the Azure Event Hub input configuration in Splunk
    The second screenshot of the Azure Event Hub input configuration in Splunk
  4. Select Add.

Configuration Files

  1. In your Splunk platform deployment, navigate to:
    • $SPLUNK_HOME/etc/apps/Splunk_TA_microsoft-cloudservices/local.
  2. Create a file named inputs.conf, if it does not already exist.
  3. Add the following stanza for the Event Hub input (a filled-in example sketch follows these steps):
[mscs_azure_event_hub://<input_stanza_name>]
account = <value>
blob_checkpoint_enabled = <value>
storage_account = <value>
container_name = <value>
consumer_group = <value>
event_hub_name = <value>
event_hub_namespace = <value>
index = <value>
interval = <value>
max_batch_size = <value>
max_wait_time = <value>
use_amqp_over_websocket = 1
sourcetype = mscs:azure:eventhub
  4. Save and restart the Splunk platform.
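For reference, a filled-in Event Hub stanza might look something like the sketch below; it assumes the same mscs_azure_event_hub:// stanza prefix shown above, and the input name, account, namespace, event hub, storage account, container, and index values are all hypothetical placeholders.

# Hypothetical example: adjust names and values to your environment
[mscs_azure_event_hub://eventhub_demo_input]
account = splunk_demo_account
blob_checkpoint_enabled = 1
storage_account = azsplunkdemostorage
container_name = eventhub-checkpoints
consumer_group = $Default
event_hub_name = splunk-demo-hub
event_hub_namespace = azure-to-splunk-demo.servicebus.windows.net
index = azure_eventhub
interval = 3600
max_batch_size = 300
max_wait_time = 300
sourcetype = mscs:azure:eventhub
use_amqp_over_websocket = 1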
Posted by: Lukas Smith